Current Issue: July - September | Volume: 2018 | Issue Number: 3 | Articles: 5
The Continuous Wavelet Transform (CWT) is an important mathematical tool in signal processing; for a fixed scale it is a linear, time-invariant operator with causality and stability, and it has wide real-life application. A novel and simple proof of the FFT-based fast method of linear convolution is presented by exploiting the structure of the circulant matrix. After introducing the Equivalent Condition of Time-domain and Frequency-domain Algorithms of CWT, a class of algorithms for the continuous wavelet transform is proposed and analyzed in this paper, which covers the algorithms in JLAB and WaveLab as well as other existing methods such as the cwt function in the MATLAB toolbox. In this framework, two theoretical issues in the computation of the CWT are analyzed. First, the edge effect is easily handled by using the Equivalent Condition of Time-domain and Frequency-domain Algorithms of CWT, and higher precision is expected. Second, because linear convolution expands the support of the signal, the question of which parts of the linear convolution are exactly the coefficients of the CWT is analyzed by exploring the relationship between the filters of the frequency-domain and time-domain algorithms, and some generalizations are given. Numerical experiments are presented to further demonstrate our analyses....
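As a minimal illustration of the FFT-based fast linear convolution the abstract refers to (a sketch, not the paper's implementation): circular convolution computed by FFTs equals linear convolution once both sequences are zero-padded to at least the full output length.

```python
import numpy as np

def fft_linear_convolution(x, h):
    """Linear convolution of x and h via zero-padded circular convolution (FFT)."""
    n = len(x) + len(h) - 1                 # full length of the linear convolution
    nfft = int(2 ** np.ceil(np.log2(n)))    # next power of two for FFT efficiency
    X = np.fft.rfft(x, nfft)                # zero-padded forward transforms
    H = np.fft.rfft(h, nfft)
    return np.fft.irfft(X * H, nfft)[:n]    # keep only the linear-convolution part

# Sanity check against direct time-domain convolution
x = np.random.randn(256)
h = np.random.randn(31)
assert np.allclose(fft_linear_convolution(x, h), np.convolve(x, h))
```

The zero padding is what makes the circulant (circular) product coincide with the ordinary linear convolution, which is the structural fact exploited in the proof mentioned above.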
Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain; however, previous works cannot effectively evaluate the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent the different auxiliary domains, so that the weight computation across domains can be converted into a weight computation across features. Then we combine the features in the target domain and in the auxiliary domains and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it effectively avoids the underfitting or overfitting problems that occur in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods....
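A minimal sketch of Locally Weighted Linear Regression (LWLR), the nonparametric model named in the abstract; the Gaussian kernel and the bandwidth `tau` are standard textbook choices here, not details taken from the paper.

```python
import numpy as np

def lwlr_predict(x_query, X, y, tau=1.0):
    """Predict the target at x_query with locally weighted linear regression.

    X: (m, d) design matrix, y: (m,) targets, tau: Gaussian kernel bandwidth.
    Each training point is weighted by its distance to the query point, then
    a weighted least-squares fit is solved locally around that query.
    """
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])          # add intercept column
    xq = np.hstack([1.0, np.atleast_1d(x_query)])
    d2 = np.sum((X - x_query) ** 2, axis=1)                # squared distances to the query
    w = np.exp(-d2 / (2.0 * tau ** 2))                     # Gaussian locality weights
    W = np.diag(w)
    theta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)   # weighted normal equations
    return xq @ theta

# Toy usage: recover a nonlinear trend from noisy samples
X = np.linspace(0, 6, 80).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.randn(80)
print(lwlr_predict(np.array([3.0]), X, y, tau=0.5))
```

Because the fit is re-solved per query with locality weights, there is no single global parameter vector to under- or over-fit, which is the property the abstract appeals to.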
Distributed Compressed Sensing (DCS) is an important research area of compressed sensing (CS). This paper aims at solving the DCS problem based on the mixed support model. For this problem, previously proposed greedy pursuit algorithms easily fall into suboptimal solutions. In this paper, an intelligent grey wolf optimizer (GWO) algorithm called DCS-GWO is proposed by combining GWO and ...
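For context, a minimal sketch of the standard grey wolf optimizer (GWO) update that DCS-GWO builds on; the objective function and parameters below are placeholders for illustration, not the DCS support-recovery formulation used in the paper.

```python
import numpy as np

def gwo_minimize(f, dim, bounds, n_wolves=20, n_iters=100, seed=0):
    """Minimize f over a box using the basic grey wolf optimizer update rule."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iters):
        fitness = np.apply_along_axis(f, 1, wolves)
        alpha, beta, delta = wolves[np.argsort(fitness)[:3]]  # three best wolves lead
        a = 2.0 - 2.0 * t / n_iters                            # linearly decreasing coefficient
        new = np.empty_like(wolves)
        for i, x in enumerate(wolves):
            cand = []
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - x)                     # distance to the leader
                cand.append(leader - A * D)                    # move toward/around the leader
            new[i] = np.clip(np.mean(cand, axis=0), lo, hi)
        wolves = new
    fitness = np.apply_along_axis(f, 1, wolves)
    return wolves[np.argmin(fitness)]

# Example: minimize the sphere function in 10 dimensions
print(gwo_minimize(lambda x: np.sum(x ** 2), dim=10, bounds=(-5.0, 5.0)))
```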
As an important data analysis method in data mining, clustering analysis has been researched extensively and in depth. Aiming at the limitation of ...
The existence of a strongly polynomial algorithm for linear programming (LP) has been widely sought for decades. Recently, a new approach called the Gravity Sliding algorithm [1] has emerged. It is a gradient descending method whereby the descending trajectory slides along the inner surfaces of a polyhedron until it reaches the optimal point. In R^3, a water droplet pulled by gravitational force traces the shortest path to descend to the lowest point. As the Gravity Sliding algorithm emulates the water droplet trajectory, it exhibits strongly polynomial behavior in R^3. We believe that it could be a strongly polynomial algorithm for linear programming in R^n as well. In fact, our algorithm can solve the Klee-Minty deformed cube problem in only two iterations, irrespective of the dimension of the cube. The core of the Gravity Sliding algorithm is how to calculate the projection of the gravity vector g onto the intersection of a group of facets, which is disclosed in the same paper [1]. In this paper, we introduce a more efficient method to compute the gradient projections on complementary facets, and rename it the Sliding Gradient algorithm under the new projection calculation....
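As an illustration of the projection step at the core of the method (a sketch based on standard linear algebra, not the more efficient calculation the paper introduces): projecting the gravity vector g onto the intersection of a group of facet hyperplanes amounts to removing from g its component in the span of the facet normals.

```python
import numpy as np

def project_onto_facets(g, A):
    """Project g onto the intersection of the facet hyperplanes whose normals are the rows of A.

    A: (k, n) matrix with linearly independent facet normals as rows.
    The result is the component of g orthogonal to every normal, i.e. the
    steepest-descent direction that keeps the trajectory on all the facets.
    """
    coeff = np.linalg.solve(A @ A.T, A @ g)   # coefficients of g on the facet normals
    return g - A.T @ coeff                    # subtract the normal components

# Toy example in R^3: gravity sliding along a single tilted facet
g = np.array([0.0, 0.0, -1.0])                # "gravity" direction
A = np.array([[1.0, 0.0, 1.0]])               # one facet normal
print(project_onto_facets(g, A))              # descent direction along that facet
```

This is the textbook formula p = g - A^T (A A^T)^{-1} A g for projecting onto the null space of A; the contribution described in the abstract is a more efficient way to evaluate such projections on complementary facets.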